
    SciRE-Solver: Efficient Sampling of Diffusion Probabilistic Models by Score-integrand Solver with Recursive Derivative Estimation

    Diffusion probabilistic models (DPMs) are a powerful class of generative models known for their ability to generate high-fidelity image samples. A major challenge in the implementation of DPMs is the slow sampling process. In this work, we introduce a high-efficiency sampler for DPMs. Specifically, we propose a score-based exact solution paradigm for the diffusion ODEs corresponding to the sampling process of DPMs, which introduces a new perspective on developing numerical algorithms for solving diffusion ODEs. To achieve an efficient sampler, we propose a recursive derivative estimation (RDE) method to reduce the estimation error. With our proposed solution paradigm and RDE method, we propose the score-integrand solver (SciRE-Solver), an efficient solver with a convergence-order guarantee for diffusion ODEs. SciRE-Solver attains state-of-the-art (SOTA) sampling performance with a limited number of score function evaluations (NFE) on both discrete-time and continuous-time DPMs in comparison to existing training-free sampling algorithms. For example, we achieve 3.48 FID with 12 NFE and 2.42 FID with 20 NFE for continuous-time DPMs on CIFAR-10. Unlike other samplers, SciRE-Solver shows promising potential to surpass the FIDs achieved in the original papers of some pre-trained models with a small number of NFEs. For example, we reach SOTA values of 2.40 FID with 100 NFE for a continuous-time DPM and of 3.15 FID with 84 NFE for a discrete-time DPM on CIFAR-10, as well as of 2.17 (2.02) FID with 18 (50) NFE for a discrete-time DPM on CelebA 64×64.
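
    The score-integrand formulation and recursive derivative estimation are specific to the paper; the overall sampling loop, however, follows the familiar pattern of numerically integrating the diffusion ODE with a pretrained noise/score network, spending one network evaluation (NFE) per step. Below is a minimal, hypothetical sketch of a generic first-order (DDIM-style) integrator, not SciRE-Solver itself; the `noise_pred` network, the `alpha_bar` schedule, and the timestep grid are placeholder assumptions.

```python
import numpy as np

def ddim_like_sampler(noise_pred, alpha_bar, timesteps, shape, seed=0):
    """Deterministic first-order diffusion-ODE sampler (DDIM-style sketch).

    noise_pred(x, t) -> predicted noise eps_theta(x, t)   (placeholder network)
    alpha_bar        -> cumulative product of (1 - beta_t), indexed by timestep
    timesteps        -> decreasing timestep grid; its length - 1 equals the NFE budget
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)                 # start from pure Gaussian noise
    for t, t_prev in zip(timesteps[:-1], timesteps[1:]):
        a_t, a_prev = alpha_bar[t], alpha_bar[t_prev]
        eps = noise_pred(x, t)                     # one network call = one NFE
        x0_hat = (x - np.sqrt(1.0 - a_t) * eps) / np.sqrt(a_t)      # predicted clean sample
        x = np.sqrt(a_prev) * x0_hat + np.sqrt(1.0 - a_prev) * eps  # deterministic update
    return x

# Toy usage with a dummy "network" that returns zeros (shape check only).
if __name__ == "__main__":
    T = 1000
    betas = np.linspace(1e-4, 0.02, T)
    alpha_bar = np.cumprod(1.0 - betas)
    steps = np.linspace(T - 1, 0, 13, dtype=int)   # 12 NFE
    dummy_net = lambda x, t: np.zeros_like(x)
    print(ddim_like_sampler(dummy_net, alpha_bar, steps, shape=(3, 32, 32)).shape)
```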

    Neural Operator Variational Inference based on Regularized Stein Discrepancy for Deep Gaussian Processes

    Deep Gaussian Process (DGP) models offer a powerful nonparametric approach for Bayesian inference, but exact inference is typically intractable, motivating the use of various approximations. However, existing approaches, such as mean-field Gaussian assumptions, limit the expressiveness and efficacy of DGP models, while stochastic approximation can be computationally expensive. To tackle these challenges, we introduce Neural Operator Variational Inference (NOVI) for Deep Gaussian Processes. NOVI uses a neural generator to obtain a sampler and minimizes the Regularized Stein Discrepancy in L2 space between the generated distribution and the true posterior. We solve the minimax problem using Monte Carlo estimation and subsampling stochastic optimization techniques. We demonstrate that the bias introduced by our method can be controlled by multiplying the Fisher divergence by a constant, which leads to robust error control and ensures the stability and precision of the algorithm. Our experiments on datasets ranging from hundreds to tens of thousands of samples demonstrate the effectiveness and faster convergence rate of the proposed method. We achieve a classification accuracy of 93.56% on the CIFAR-10 dataset, outperforming SOTA Gaussian process methods. Furthermore, our method guarantees theoretically controlled prediction error for DGP models and demonstrates remarkable performance on various datasets. We are optimistic that NOVI has the potential to enhance the performance of deep Bayesian nonparametric models and could have significant implications for various practical applications.
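
    NOVI minimizes a regularized Stein discrepancy in L2 space via a minimax problem, which is specific to the paper. As a simpler, related illustration of how a Stein discrepancy compares generated samples against a posterior known only through its score function, the sketch below estimates a kernelized Stein discrepancy (KSD) with an RBF kernel; the standard-normal target, bandwidth, and V-statistic estimator are illustrative choices, not the paper's construction.

```python
import numpy as np

def ksd_rbf(samples, score_fn, bandwidth=1.0):
    """V-statistic estimate of the kernelized Stein discrepancy with an RBF kernel.

    samples  : (n, d) array drawn from the approximating distribution q
    score_fn : maps x -> grad_x log p(x) of the target density p
    """
    n, d = samples.shape
    s = np.stack([score_fn(x) for x in samples])          # (n, d) target scores
    diff = samples[:, None, :] - samples[None, :, :]      # (n, n, d) pairwise x - x'
    sq = np.sum(diff ** 2, axis=-1)                       # (n, n) squared distances
    h2 = bandwidth ** 2
    k = np.exp(-sq / (2.0 * h2))                          # RBF kernel matrix

    term1 = (s @ s.T) * k                                 # s(x)^T s(x') k(x, x')
    term2 = np.einsum("id,ijd->ij", s, diff) / h2 * k     # s(x)^T grad_{x'} k
    term3 = -np.einsum("jd,ijd->ij", s, diff) / h2 * k    # grad_x k^T s(x')
    term4 = (d / h2 - sq / h2 ** 2) * k                   # trace of mixed second derivative
    return float(np.mean(term1 + term2 + term3 + term4))

# Toy check: standard-normal samples against a standard-normal target (score = -x).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xs = rng.standard_normal((200, 2))
    print(ksd_rbf(xs, score_fn=lambda x: -x))             # small value for a good fit
```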

    Double Normalizing Flows: Flexible Bayesian Gaussian Process ODEs Learning

    Recently, Gaussian processes have been utilized to model the vector field of continuous dynamical systems. Bayesian inference for such models (Hegde et al., 2022) has been extensively studied and applied in tasks such as time series prediction, providing uncertainty estimates. However, previous Gaussian Process Ordinary Differential Equation (ODE) models may underperform on datasets with non-Gaussian process priors, as their constrained priors and mean-field posteriors may lack flexibility. To address this limitation, we incorporate normalizing flows to reparameterize the vector field of ODEs, resulting in a more flexible and expressive prior distribution. Additionally, because the probability density functions of normalizing flows are analytically tractable, we apply them to the posterior inference of GP ODEs, generating a non-Gaussian posterior. Through these dual applications of normalizing flows, our model improves accuracy and uncertainty estimates for Bayesian Gaussian Process ODEs. The effectiveness of our approach is demonstrated on simulated dynamical systems and real-world human motion data, including tasks such as time series prediction and missing data recovery. Experimental results indicate that our proposed method effectively captures model uncertainty while improving accuracy.
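
    The common building block behind both uses of normalizing flows described above (reparameterizing the vector field and building a non-Gaussian posterior) is an invertible transform with a tractable log-determinant Jacobian. As a minimal, generic illustration, and not the specific flow used in the paper, here is a planar flow layer with its analytic log-det term; the initialization and dimensions are placeholders.

```python
import numpy as np

class PlanarFlow:
    """One planar-flow layer: f(z) = z + u * tanh(w.z + b), with analytic log|det J|.

    A full implementation would constrain u so that w.u >= -1 to guarantee invertibility;
    the small random initialization below keeps this toy example in the safe regime.
    """

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.u = rng.normal(scale=0.1, size=dim)
        self.w = rng.normal(scale=0.1, size=dim)
        self.b = 0.0

    def forward(self, z):
        """Transform samples z of shape (n, d); return f(z) and log|det df/dz| per sample."""
        lin = z @ self.w + self.b                             # (n,)
        f = z + np.tanh(lin)[:, None] * self.u                # (n, d)
        psi = (1.0 - np.tanh(lin) ** 2)[:, None] * self.w     # (n, d), derivative of the tanh term
        logdet = np.log(np.abs(1.0 + psi @ self.u) + 1e-12)   # |1 + u^T psi(z)|
        return f, logdet

# Usage: push base Gaussian samples through the flow and apply the change-of-variables formula.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    z = rng.standard_normal((5, 2))
    x, logdet = PlanarFlow(dim=2).forward(z)
    base_logp = -0.5 * np.sum(z ** 2, axis=1) - z.shape[1] * 0.5 * np.log(2 * np.pi)
    print(x.shape, base_logp - logdet)                        # log q(x) = log N(z) - log|det|
```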

    Underwater image quality assessment: subjective and objective methods

    Underwater image enhancement plays a critical role in the marine industry. Various algorithms are applied to enhance underwater images, but their performance in terms of perceptual quality has been little studied. In this paper, we investigate five popular enhancement algorithms and the quality of their output images. To this end, we have created a benchmark, including images enhanced by different algorithms and ground-truth image quality obtained through human perception experiments. We statistically analyse the impact of various enhancement algorithms on the perceived quality of underwater images. Also, the visual quality provided by these algorithms is evaluated objectively, aiming to inform the development of objective metrics for automatic assessment of the quality of underwater image enhancement. The image quality benchmark and its objective metric are made publicly available.
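
    Objective metrics built on a benchmark like this are typically validated by correlating their predictions with the subjective ground-truth scores (e.g., mean opinion scores). The snippet below is a generic, hypothetical evaluation helper using standard SciPy correlations; the score arrays are placeholders, not data from the benchmark.

```python
import numpy as np
from scipy import stats

def evaluate_metric(objective_scores, mos):
    """Correlate an objective quality metric with subjective mean opinion scores (MOS)."""
    objective_scores = np.asarray(objective_scores, dtype=float)
    mos = np.asarray(mos, dtype=float)
    srocc, _ = stats.spearmanr(objective_scores, mos)   # monotonic (rank) agreement
    plcc, _ = stats.pearsonr(objective_scores, mos)     # linear agreement
    rmse = float(np.sqrt(np.mean((objective_scores - mos) ** 2)))
    return {"SROCC": srocc, "PLCC": plcc, "RMSE": rmse}

# Placeholder scores for illustration only.
if __name__ == "__main__":
    pred = [3.1, 2.4, 4.0, 1.8, 3.6]
    mos = [3.0, 2.2, 4.3, 2.0, 3.5]
    print(evaluate_metric(pred, mos))
```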

    Convex optimization method for quantifying image quality induced saliency variation

    Visual saliency plays a significant role in image quality assessment. Image distortions cause saliency to shift from its original locations. Being able to measure such distortion-induced saliency variation (DSV) contributes towards the optimal use of saliency in automated image quality assessment. In our previous study, a benchmark for the measurement of DSV through subjective testing was built. However, existing saliency similarity measures are unhelpful for the quantification of DSV because DSV highly depends on the dispersion degree of a saliency map. In this paper, we propose a novel similarity metric for the measurement of DSV, namely MDSV, based on a convex optimization method. The proposed MDSV metric integrates the local saliency similarity measure and the global saliency similarity measure, using a function of saliency dispersion as a modulator. We detail the parameter selection of the proposed metric and the interactions of sub-models in the convex optimization strategy. Statistical analyses show that our proposed MDSV outperforms existing metrics in quantifying image quality induced saliency variation.
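
    The abstract describes MDSV as a combination of a local and a global saliency-similarity measure whose weighting is modulated by how dispersed the saliency map is; the actual sub-models and parameters come from the paper's convex optimization. The sketch below is a purely hypothetical illustration of that dispersion-modulated combination, using simple stand-ins (patchwise correlation, histogram intersection, entropy-based dispersion) rather than the authors' measures.

```python
import numpy as np

def dispersion(sal, eps=1e-12):
    """Entropy of the normalized saliency map as a crude dispersion proxy in [0, 1]."""
    p = sal / (sal.sum() + eps)
    return float(-np.sum(p * np.log(p + eps)) / np.log(p.size))

def local_similarity(ref, dist, patch=16):
    """Mean correlation between co-located patches of two saliency maps."""
    cors = []
    for i in range(0, ref.shape[0] - patch + 1, patch):
        for j in range(0, ref.shape[1] - patch + 1, patch):
            a = ref[i:i + patch, j:j + patch].ravel()
            b = dist[i:i + patch, j:j + patch].ravel()
            if a.std() > 0 and b.std() > 0:
                cors.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(cors)) if cors else 0.0

def global_similarity(ref, dist, bins=32):
    """Histogram intersection between the two maps' saliency distributions."""
    hr, _ = np.histogram(ref, bins=bins, range=(0, 1))
    hd, _ = np.histogram(dist, bins=bins, range=(0, 1))
    return float(np.minimum(hr / hr.sum(), hd / hd.sum()).sum())

def mdsv_like(ref, dist):
    """Dispersion-modulated mix of local and global similarity (illustrative only)."""
    w = dispersion(ref)   # more dispersed saliency -> lean on the global term
    return (1.0 - w) * local_similarity(ref, dist) + w * global_similarity(ref, dist)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    dist = np.clip(ref + 0.1 * rng.standard_normal((64, 64)), 0, 1)
    print(mdsv_like(ref, dist))
```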

    An underwater image quality assessment metric

    Various image enhancement algorithms are adopted to improve underwater images, which often suffer from visual distortions. It is critical to assess the output quality of underwater images undergoing enhancement algorithms and to use the results to optimise underwater imaging systems. In our previous study, we created a benchmark for quality assessment of underwater image enhancement via subjective experiments. Building on the benchmark, this paper proposes a new objective metric, namely UWEQM, that can automatically assess the output quality of image enhancement. By characterising specific underwater physics and relevant properties of the human visual system, image quality attributes are computed and combined to yield an overall metric. Experimental results show that the proposed UWEQM metric yields good performance in predicting image quality as perceived by human subjects.
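
    UWEQM's particular attributes and their combination are defined in the paper. As a loosely related, hypothetical sketch of the general idea of computing per-attribute scores and pooling them into an overall quality value, the snippet below uses three generic attributes (colorfulness, RMS contrast, gradient-based sharpness) with arbitrary placeholder weights; none of these choices are taken from UWEQM.

```python
import numpy as np

def colorfulness(img):
    """Hasler-Suesstrunk style colorfulness for an RGB image with values in [0, 1]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    rg, yb = r - g, 0.5 * (r + g) - b
    return float(np.sqrt(rg.std() ** 2 + yb.std() ** 2)
                 + 0.3 * np.sqrt(rg.mean() ** 2 + yb.mean() ** 2))

def contrast(img):
    """Global RMS contrast of the luminance channel."""
    return float(img.mean(axis=-1).std())

def sharpness(img):
    """Mean gradient magnitude of the luminance channel as a simple sharpness proxy."""
    gy, gx = np.gradient(img.mean(axis=-1))
    return float(np.mean(np.sqrt(gx ** 2 + gy ** 2)))

def quality_score(img, weights=(0.4, 0.3, 0.3)):
    """Weighted sum of attribute scores; the weights are arbitrary placeholders."""
    feats = np.array([colorfulness(img), contrast(img), sharpness(img)])
    return float(np.dot(weights, feats))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((128, 128, 3))   # stand-in for an enhanced underwater image
    print(quality_score(img))
```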